where $dQ$ is the heat that flows into the system. In thermodynamics, the internal
energy $E$ of a system is formally defined by the First Law as the difference between
the heat and $dW$, the work done by the system:
$$dE = dQ - dW \,. \qquad (6.9)$$
The only way that a system can absorb heat without raising its temperature is by
becoming more disordered. Hence, entropy is a measure of disorder. Starting from a
microscopic viewpoint, entropy is given by the famous formula inscribed on
Boltzmann’s tombstone:
$$S = k_{\mathrm B} \ln W \,, \qquad (6.10)$$
where $k_{\mathrm B}$ is his constant and $W$ is the number of (micro)states available to the system.
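To get a feel for the magnitudes involved, here is a minimal numerical sketch of (6.10), assuming a toy system of one mole of independent two-state units (so that $W = 2^{N_{\mathrm A}}$; the example is illustrative, not taken from the text):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
N_A = 6.02214076e23  # Avogadro constant, 1/mol

def boltzmann_entropy(n_units: float, states_per_unit: int = 2) -> float:
    """S = k_B ln W for W = states_per_unit**n_units equiprobable microstates.

    Uses ln W = n_units * ln(states_per_unit), so the astronomically
    large W is never formed explicitly.
    """
    return K_B * n_units * math.log(states_per_unit)

# One mole of independent two-state units (e.g. spins): W = 2**N_A.
print(f"S      = {boltzmann_entropy(N_A):.3f} J/K")  # = R ln 2, about 5.763 J/K
print(f"log2 W = {N_A:.3e} bits to single out one microstate")
```

One mole of binary disorder thus amounts to only $R \ln 2 \approx 5.76$ J/K of entropy, yet corresponds to some $6 \times 10^{23}$ bits of microscopic information.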
Note that reducing the number of states reduces the disorder. Information amount-
ing to $\log_2 W$ bits is required to specify one particular microstate, assuming that
all microstates have the same probability of being occupied, according to Hartley’s
formula; the specification of a particular microstate removes that amount of uncer-
tainty. Thermodynamical entropy defined by Eq. (6.8), statistical mechanical entropy
(6.10), and the Hartley or Shannon index only differ from each other by numerical
constants.
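Writing $I = \log_2 W$ for the Hartley information makes these constants explicit:
$$S = k_{\mathrm B} \ln W = (k_{\mathrm B} \ln 2)\, \log_2 W \approx 9.57 \times 10^{-24}\ \mathrm{J\,K^{-1}} \times I \,,$$
so one bit of information corresponds to $k_{\mathrm B} \ln 2$ of thermodynamic entropy.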
Although the set of positions and momenta of the molecules in a gas at a given
instant can thus be considered as information, within a microscopic interval (between
atomic collisions, of the order of 0.1 ps) this set is forgotten and another set is real-
ized. The positions and momenta constitute microscopic information; the quantity of
macroscopic (remembered) information is zero. In general, the quantity of macroin-
formation is far less than the quantity of (forgotten) microinformation, but the former
is far more valuable.5
In the world of engineering, this state of affairs has of course always been recog-
nized. One does not need to know the temperature (within reason!) in order to design
a bridge or a mechanism. The essential features of any construction are found in a
few large-scale correlated motions; the vast number of uncorrelated, thermal degrees
of freedom is generally unimportant.
Symbol and Word Entropies. The Shannon index (6.5) gives the average information
per symbol; an analogous quantity $I_n$ can be defined for the probability of
$n$-mers ($n$-symbol “words”), whence the differential entropy $\tilde{I}_n$,
$$\tilde{I}_n = I_{n+1} - I_n \,, \qquad (6.11)$$
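A minimal computational sketch of $I_n$ and $\tilde{I}_n$, assuming empirical word probabilities estimated from overlapping $n$-mer counts in a sample string (both the estimator and the sample text are illustrative choices):

```python
import math
from collections import Counter

def block_entropy(seq: str, n: int) -> float:
    """I_n: Shannon entropy (bits) of the empirical distribution of
    overlapping n-symbol words in seq."""
    words = [seq[i:i + n] for i in range(len(seq) - n + 1)]
    total = len(words)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(words).values())

def differential_entropy(seq: str, n: int) -> float:
    """~I_n = I_{n+1} - I_n (Eq. 6.11): the extra information carried by
    one more symbol, given an n-symbol context."""
    return block_entropy(seq, n + 1) - block_entropy(seq, n)

text = "the cat sat on the mat and the rat sat on the cat"  # illustrative
for n in (1, 2, 3):
    print(f"I_{n} = {block_entropy(text, n):.3f} bits/word, "
          f"~I_{n} = {differential_entropy(text, n):.3f} bits")
```

For a stationary source, $\tilde{I}_n$ equals the entropy of the next symbol conditioned on an $n$-symbol context, and is non-increasing in $n$; correlations between symbols make it fall below the single-symbol Shannon index.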
5 “Forgetting” implies decay of information; what does “remembering” mean? It means bringing a
system to a defined stable state (i.e., one of two or more such states), from which the system can
switch to another state only under the influence of an external impulse. The physical realization of such systems
implies a minimum of several atoms; as a rule a single atom, or a simple small molecule, can
exist in only one stable state. Among the smallest molecules fulfilling this condition are sugars
and amino acids, which can exist in left- and right-handed chiralities. Note that many biological
macromolecules and supramolecular assemblies can exist in several stable states.